Adaptive KNN Classification Based on Laplacian Eigenmaps and Kernel Mixtures

Author

  • Renqiang Min
Abstract

K Nearest Neighbor (kNN) is one of the most popular machine learning techniques, but it often performs poorly when the distance metric is chosen inappropriately or when many irrelevant features are present. Linear and non-linear feature transformation methods have been applied to extract class-relevant information to improve kNN classification. In this paper, I describe kNN classification in a large-margin framework, in which a non-linear feature mapping is first sought through Laplacian eigenmaps or kernel mixtures, and then a linear transformation matrix is learned to achieve a large margin.
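As a rough illustration of the pipeline the abstract describes (a non-linear embedding via Laplacian eigenmaps followed by kNN in the embedded space), here is a minimal NumPy sketch. The fully connected heat-kernel graph, the bandwidth `sigma`, and the transductive setup are illustrative assumptions, and the paper's large-margin linear transformation step is omitted; this is not the author's implementation.

```python
import numpy as np

def laplacian_eigenmap(X, dim=2, sigma=1.0):
    """Embed rows of X via Laplacian eigenmaps (transductive sketch)."""
    # Pairwise squared Euclidean distances
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / (2 * sigma ** 2))  # heat-kernel affinities on a full graph
    np.fill_diagonal(W, 0.0)
    d = W.sum(axis=1)
    L = np.diag(d) - W                  # unnormalized graph Laplacian
    # Solve the generalized problem L y = lambda D y via the symmetric form
    Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
    vals, vecs = np.linalg.eigh(Dinv_sqrt @ L @ Dinv_sqrt)
    # Skip the trivial constant eigenvector (eigenvalue ~ 0)
    return Dinv_sqrt @ vecs[:, 1:dim + 1]

def knn_predict(Y, labels, train_idx, test_idx, k=3):
    """Majority-vote kNN in the embedded space Y."""
    train_idx = np.asarray(train_idx)
    preds = []
    for i in test_idx:
        dist = np.linalg.norm(Y[train_idx] - Y[i], axis=1)
        votes = labels[train_idx[np.argsort(dist)[:k]]]
        preds.append(np.bincount(votes).argmax())
    return np.array(preds)
```

Because the embedding is computed from all points at once, this sketch is transductive: test points must be present when the graph is built, which is the usual setting for Laplacian eigenmaps.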


Related Articles

Kernel Laplacian Eigenmaps for Visualization of Non-vectorial Data

In this paper, we propose Kernel Laplacian Eigenmaps for nonlinear dimensionality reduction. The method extends to any structured input beyond the usual vectorial data, enabling the visualization of a wider range of data in low dimension once suitable kernels are defined. A comparison with related methods on the MNIST handwritten digits data set supports the claims of our approach....


Laplacian Spectrum Learning

The eigenspectrum of a graph Laplacian encodes smoothness information over the graph. A natural approach to learning involves transforming the spectrum of a graph Laplacian to obtain a kernel. While manual exploration of the spectrum is conceivable, non-parametric learning methods that adjust the Laplacian’s spectrum promise better performance. For instance, adjusting the graph Laplacian using ...
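The spectrum transformation described in this blurb can be sketched in a few lines: decompose the graph Laplacian and reassemble a kernel from transformed eigenvalues. The exponential-decay transform `g` and the toy graph are illustrative assumptions, not the non-parametric learning method the paper proposes.

```python
import numpy as np

def spectrum_kernel(W, g):
    """Build a kernel by transforming the spectrum of the graph Laplacian.

    W: symmetric non-negative weight matrix of a graph.
    g: function applied elementwise to the Laplacian eigenvalues.
    """
    d = W.sum(axis=1)
    L = np.diag(d) - W                    # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    # K = sum_i g(lambda_i) v_i v_i^T  (columns of vecs scaled by g(vals))
    return (vecs * g(vals)) @ vecs.T
```

A decreasing `g`, such as `lambda lam: np.exp(-0.5 * lam)`, downweights the rough, high-frequency eigenvectors and so favors functions that are smooth over the graph.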


Spectral Clustering and Kernel PCA are Learning Eigenfunctions

In this paper, we show a direct equivalence between spectral clustering and kernel PCA, and how both are special cases of a more general learning problem, that of learning the principal eigenfunctions of a kernel, when the functions are from a function space whose scalar product is defined with respect to a density model. This defines a natural mapping for new data points, for methods that only...


Stratified Structure of Laplacian Eigenmaps Embedding

We construct a locality-preserving weight matrix for the Laplacian eigenmaps algorithm used in dimension reduction. Our point cloud data is sampled from a low-dimensional stratified space embedded in a higher dimension. Specifically, we use tools developed in local homology and persistent homology for kernels and cokernels to infer a weight matrix which captures neighborhood relations among points in ...


Discussion of "Spectral Dimensionality Reduction via Maximum Entropy"

Since the introduction of LLE (Roweis and Saul, 2000) and Isomap (Tenenbaum et al., 2000), a large number of non-linear dimensionality reduction techniques (manifold learners) have been proposed. Many of these non-linear techniques can be viewed as instantiations of Kernel PCA; they employ a cleverly designed kernel matrix that preserves local data structure in the “feature space” (Bengio et al...




Publication date: 2009